
    No effect of auditory–visual spatial disparity on temporal recalibration

    It is known that the brain adaptively recalibrates itself to small (∼100 ms) auditory–visual (AV) temporal asynchronies so as to maintain intersensory temporal coherence. Here we explored whether spatial disparity between a sound and a light affects AV temporal recalibration. Participants were exposed to a train of asynchronous AV stimulus pairs (sound-first or light-first) with sounds and lights emanating from either the same or a different location. Following a short exposure phase, participants were tested on an AV temporal order judgement (TOJ) task. Temporal recalibration manifested itself as a shift of subjective simultaneity in the direction of the adapted audiovisual lag. The shift was equally large whether exposure and test stimuli were presented from the same or different locations. These results provide strong evidence that spatial co-localisation is not a necessary constraint for intersensory pairing to occur.
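    As a rough illustration of how such a shift is typically quantified (a generic sketch, not the authors' analysis code), the point of subjective simultaneity (PSS) can be estimated by fitting a cumulative Gaussian to the proportion of "light-first" responses as a function of stimulus onset asynchrony (SOA); recalibration then appears as a shift of the fitted PSS towards the adapted lag. A minimal Python sketch with hypothetical data:

        # Minimal sketch: estimating the point of subjective simultaneity (PSS)
        # from temporal order judgement (TOJ) data; all values are hypothetical.
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        soa = np.array([-200, -100, -50, 0, 50, 100, 200])  # ms; negative = sound first
        p_light_first = np.array([0.05, 0.15, 0.30, 0.55, 0.75, 0.90, 0.98])

        def psychometric(x, pss, sigma):
            # Cumulative Gaussian: the PSS is its 50% point, sigma its spread.
            return norm.cdf(x, loc=pss, scale=sigma)

        (pss, sigma), _ = curve_fit(psychometric, soa, p_light_first, p0=(0.0, 50.0))
        print(f"PSS = {pss:.1f} ms, sigma = {sigma:.1f} ms")
        # Recalibration would show up as a PSS difference between sound-first
        # and light-first exposure conditions.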

    Neural responses in parietal and occipital areas in response to visual events are modulated by prior multisensory stimuli

    The effect of multimodal vs. unimodal prior stimuli on the subsequent processing of a simple flash stimulus was studied in the context of the audio-visual 'flash-beep' illusion, in which the number of flashes a person sees is influenced by accompanying beep stimuli. EEG recordings were made while combinations of simple visual and audio-visual stimuli were presented. The experiments found that the electric field strength related to a flash stimulus was stronger when it was preceded by a multimodal flash/beep stimulus than when it was preceded by another unimodal flash stimulus. This difference was significant in two distinct timeframes: an early timeframe, from 130–160 ms, and a late timeframe, from 300–320 ms. Source localisation analysis found that the increased activity in the early interval was localised to an area centred on the inferior and superior parietal lobes, whereas the later increase was associated with stronger activity in an area centred on primary and secondary visual cortex, in the occipital lobe. The results suggest that processing of a visual stimulus can be affected by the presence of an immediately prior multisensory event: relatively long-lasting interactions generated by the initial auditory and visual stimuli altered the processing of a subsequent visual stimulus.

    Neural correlates of audiovisual motion capture

    Visual motion can affect the perceived direction of auditory motion (i.e., audiovisual motion capture). It is debated, though, whether this effect occurs at perceptual or decisional stages. Here, we examined the neural consequences of audiovisual motion capture using the mismatch negativity (MMN), an event-related brain potential reflecting pre-attentive auditory deviance detection. In an auditory-only condition, occasional changes in the direction of a moving sound (deviants) elicited an MMN starting around 150 ms. In an audiovisual condition, auditory standards and deviants were synchronized with a visual stimulus that moved in the same direction as the auditory standards. These audiovisual deviants did not evoke an MMN, indicating that visual motion reduced the perceptual difference between the sound motion of standards and deviants. The inhibition of the MMN by visual motion provides evidence that auditory and visual motion signals are integrated at early sensory processing stages.

    A transition from unimodal to multimodal activations in four sensory modalities in humans: an electrophysiological study

    Background: To investigate the long-latency activities common to all sensory modalities, electroencephalographic responses to auditory (1000 Hz pure tone), tactile (electrical stimulation to the index finger), visual (simple figure of a star), and noxious (intra-epidermal electrical stimulation to the dorsum of the hand) stimuli were recorded from 27 scalp electrodes in 14 healthy volunteers. Results: Source modeling showed multimodal activations in the anterior part of the cingulate cortex (ACC) and hippocampal region (Hip). The activity in the ACC was biphasic. In all sensory modalities, the first component of ACC activity peaked 30–56 ms later than the peak of the major modality-specific activity, the second component of ACC activity peaked 117–145 ms later than the peak of the first component, and the activity in Hip peaked 43–77 ms later than the second component of ACC activity. Conclusion: The temporal sequence of activations through modality-specific and multimodal pathways was similar among all sensory modalities.

    Perinatal Asphyxia Affects Rat Auditory Processing: Implications for Auditory Perceptual Impairments in Neurodevelopmental Disorders

    Perinatal asphyxia, a naturally and commonly occurring risk factor at birth, represents one of the major causes of neonatal encephalopathy, with long-term consequences for infants. Here, degraded spectral and temporal responses to sounds were recorded from neurons in the primary auditory cortex (A1) of adult rats exposed to asphyxia at birth. Response onset latencies and durations were increased, response amplitudes were reduced, and tuning curves were broader. Degraded successive-stimulus masking inhibitory mechanisms were associated with a reduced capability of neurons to follow higher-rate repetitive stimuli. The architecture of the peripheral inner-ear sensory epithelium was preserved, suggesting that the recorded abnormalities are of central origin. We discuss the implications of these findings for the language perception and expression deficits recorded in developmental disorders, such as autism spectrum disorders, to which perinatal asphyxia contributes.

    Top-down and bottom-up modulation in processing bimodal face/voice stimuli

    Background: Processing of multimodal information is a critical capacity of the human brain, and classic studies show that bimodal stimulation can either facilitate or interfere with perceptual processing. Comparing activity to congruent and incongruent bimodal stimuli can reveal sensory dominance in particular cognitive tasks. Results: We investigated audiovisual interactions driven by stimulus properties (bottom-up influences) or by task (top-down influences) on congruent and incongruent simultaneously presented faces and voices while ERPs were recorded. Subjects performed gender categorisation, directing attention either to faces or to voices, and also judged whether the face/voice stimuli were congruent in terms of gender. Behaviourally, the unattended modality affected processing in the attended modality: the disruption was greater for attended voices. ERPs revealed top-down modulations of early brain processing (30–100 ms) over unisensory cortices. No effects were found on the N170 or VPP, but from 180–230 ms larger right frontal activity was seen for incongruent than for congruent stimuli. Conclusions: Our data demonstrate that in a gender categorisation task the processing of faces dominates over the processing of voices. Brain activity was modulated differently by top-down and bottom-up information: top-down influences modulated early brain activity, whereas bottom-up interactions occurred relatively late.

    Intact spectral but abnormal temporal processing of auditory stimuli in autism.

    The perceptual pattern in autism has been related either to a specific localized processing deficit or to a pathway-independent, complexity-specific anomaly. We examined auditory perception in autism using an auditory disembedding task that required spectral and temporal integration. Twenty-three children with high-functioning autism and 23 matched controls participated. Participants were presented with two-syllable words embedded in various auditory backgrounds (pink noise, moving ripple, amplitude-modulated pink noise, amplitude-modulated moving ripple) to assess speech-in-noise reception thresholds. The gain in signal perception from pink noise with temporal dips relative to pink noise without temporal dips was smaller in children with autism (p = 0.008). Thus, the autism group was less able to integrate auditory information present in temporal dips in the background sound, supporting the complexity-specific perceptual account.
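    For context (a generic illustration, not the paper's analysis): the gain from temporal dips is conventionally computed as masking release, the difference between speech reception thresholds (SRTs) in unmodulated and amplitude-modulated backgrounds. A minimal sketch with hypothetical thresholds:

        # Minimal sketch: masking release (dip-listening gain) as an SRT
        # difference; the threshold values below are hypothetical.
        srt_steady = -6.0      # SRT in dB SNR, unmodulated pink noise
        srt_modulated = -12.0  # SRT in dB SNR, amplitude-modulated pink noise
        masking_release = srt_steady - srt_modulated
        print(f"Masking release = {masking_release:.1f} dB")  # larger = better dip listening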

    Looming sounds enhance orientation sensitivity for visual stimuli on the same side as such sounds

    Several recent multisensory studies show that sounds can influence visual processing. Some visual judgments can be enhanced for visual stimuli near a sound occurring around the same time. A recent TMS study (Romei et al. 2009) indicates that looming sounds might influence visual cortex particularly strongly. But unlike most previous behavioral studies of possible audio-visual exogenous effects, TMS phosphene thresholds rather than judgments of external visual stimuli were measured. Moreover, the visual hemifield assessed relative to the hemifield of the sound was not varied. Here, we compared the impact of looming sounds with receding or "static" sounds, using auditory stimuli adapted from Romei et al. (2009), but now assessing any influence on visual orientation discrimination for Gabor patches (well known to involve early visual cortex) appearing either in the same hemifield as the sound or on the opposite side. The looming sounds that were effective in Romei et al. (2009) enhanced visual orientation sensitivity (d′) here on the side of the sound, but not in the opposite hemifield. This crossmodal, spatially specific effect was stronger for looming than for receding or static sounds. As in Romei et al. (2009), the differential effect for looming sounds was eliminated when white noise was used rather than structured sounds. Our new results show that looming structured sounds can specifically benefit visual orientation sensitivity in the hemifield of the sound, even when the sound provides no information about visual orientation itself.
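    For reference (a generic signal-detection illustration, not the paper's analysis code): sensitivity d′ is computed from hit and false-alarm rates as d′ = z(H) − z(F), where z is the inverse of the standard normal cumulative distribution. A minimal sketch with hypothetical rates:

        # Minimal sketch: computing d' from hit and false-alarm rates
        # (signal detection theory); the rates below are hypothetical.
        from scipy.stats import norm

        hit_rate = 0.82          # e.g., correct orientation judgments, sound side
        false_alarm_rate = 0.25  # e.g., incorrect responses to the other orientation
        d_prime = norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)
        print(f"d' = {d_prime:.2f}")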